
    A Two-Step Approach for Transforming Continuous Variables to Normal: Implications and Recommendations for IS Research

    This article describes and demonstrates a two-step approach for transforming non-normally distributed continuous variables into normally distributed ones. Step 1 transforms the variable into a percentile rank, which yields uniformly distributed probabilities. Step 2 applies the inverse-normal transformation to the results of Step 1, producing a variable of normally distributed z-scores. The approach is little known outside the statistics literature, has scarcely been used in the social sciences, and has not been used in any IS study. The article illustrates how to implement the approach in Excel, SPSS, and SAS and explains implications and recommendations for IS research.
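    The two steps described above can be sketched in Python (the article's own examples use Excel, SPSS, and SAS; this SciPy version is an illustrative translation, not the authors' code):

```python
import numpy as np
from scipy import stats

def two_step_normalize(x):
    """Two-step transformation toward normality.

    Step 1: convert values to fractional (percentile) ranks,
            yielding approximately uniform probabilities in (0, 1).
    Step 2: apply the inverse-normal transformation to those
            probabilities, yielding normally distributed z-scores.
    """
    x = np.asarray(x, dtype=float)
    # Fractional rank; dividing by n + 1 keeps every probability
    # strictly inside (0, 1), so the inverse-normal value is finite.
    probs = stats.rankdata(x) / (len(x) + 1)
    return stats.norm.ppf(probs)

# Example: a strongly right-skewed variable becomes approximately normal.
skewed = np.random.default_rng(0).exponential(scale=2.0, size=1000)
z = two_step_normalize(skewed)
```

    Because the transformation is rank-based, it is monotone: the ordering of cases is preserved while the distribution's shape is replaced with a normal one.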

    Tutorial of an Ontological Support System

    This tutorial describes a technology-mediated solution, the ontological support system (OSS), for specifying, organizing, representing and using elements of meaning in a body of knowledge. Theoretically influenced by ontological specification, the OSS was validated through iterative prototyping in a companion CAIS research paper. This article reports on how the OSS guides users through several content analysis phases (selection, delineation, transfer and use of content) in the context of academic research. Users typically find the system to be useful, easy to use and compatible with collaborative work when using it for content analysis in academic research.

    EVALUATING THE DOWNSTREAM EFFECTS OF THE TWO-STEP TRANSFORMATION TOWARD NORMALITY: IMPLICATIONS FOR MIS RESEARCH

    This paper empirically evaluates the usefulness of the Two-Step approach for transforming continuous variables toward normality. The study uses 27 corporate financial performance (CFP) variables on 39,216 US corporations to compare three variable sets: 1) random-normal, 2) original, and 3) transformed toward normality using the Two-Step. The results of several statistical procedures relevant to formative index (construct) construction are used to compare the three variable forms. The results provide strong evidence that the Two-Step approach is useful for 1) improving the normality of continuous variables, 2) improving sampling adequacy for factor analysis, 3) dramatically increasing intercorrelations, and 4) dramatically strengthening main effects tests involving the CFP variables. The findings matter for MIS research and practice: because the Two-Step technique is shown here to change the results of effects tests significantly, it has profound implications for the advancement of the MIS discipline and for practical applications (e.g., data mining).

    A Scientometric Investigation into the Validity of IS Journal Quality Measures

    In this study we investigated the measurement validity of the findings in the IS journal quality stream over the past ten years. Our evaluation applied a series of validation tests to the metrics presented in these studies using data from multiple sources. The results of our tests for content, convergent, and discriminant validity, as well as those for parallel-form, test-retest, and item-to-total reliability, were highly supportive. From these findings, we conclude that recent studies in the IS journal quality stream are credible. As such, these IS journal quality measures provide appropriate indicators of relative journal quality. This conclusion is important for both academic administrators and scientometric researchers, the latter of whom depend on journal quality measures in the evaluation of published IS research.

    Validation of a Content Analysis System Using an Iterative Prototyping Approach to Action Research

    In the face of a more rapid pace of scientific development, academic societies and competitive organizations alike are seeking new methods for content analysis. This paper describes a theoretically driven action research study that delivers a technology-mediated solution for specifying, organizing, representing and using elements of meaning in a body of knowledge. The theoretical basis, 'ontological specification', is of particular interest to IS professionals, particularly those involved in analysis and design, because it guides the efficient transformation of tacit knowledge into an explicit form. The technology-mediated solution influenced by ontological specification was validated through an iterative prototyping form of action research. Users reported that the system was useful in their work, easy to use, and compatible with collaborative work when using it for content analysis in academic research.

    Securing Personal Information Assets: Testing Antecedents of Behavioral Intentions

    Due to the increased global reliance on information technology and the prominent value of information resources, identity theft is a problem affecting millions of computer users annually. The realities of identity theft are highly visible in the global media, although empirical investigations of the topic are limited. The purpose of this study is to identify and analyze perceptions of personal information (e.g., identity) as they relate to perceived threats, mitigation, perceived risks, and intentions to follow safe information practices. We propose a risk analysis model based on theoretical variables that have been researched and extensively used in both government and private sector organizations. The model is empirically tested using LISREL to perform structural equation modeling. Findings indicate support for a relationship between risk and both 1) behavioral intentions to perform safe information practices and 2) personal information asset value.

    Information Technology Firms: Creating Value through Digital Disruption

    Information technology (IT) firms compose the majority of the most highly valued corporations in the world based on market capitalization. To date, only Apple and Amazon, both IT companies, have reached or nearly reached a market capitalization of one trillion US dollars. The value that IT provides speaks to how managers exploit disruptive technologies to create value in both IT and non-IT firms. A panel held at the 2018 Americas Conference on Information Systems (AMCIS) discussed various ways in which firms build value around IT through successful management. This paper reports on the panel discussion from a variety of perspectives, including practitioner and researcher worldviews. It also provides a sample frame that researchers can use in quantitative research involving IT firms and advocates for increased research to understand the wide range of strategies IT firms use to create value.

    Abstracts from the NIHR INVOLVE Conference 2017

    n/a

    The development and validation of a scoring tool to predict the operative duration of elective laparoscopic cholecystectomy

    Background: The ability to accurately predict operative duration has the potential to optimise theatre efficiency and utilisation, thus reducing costs and increasing staff and patient satisfaction. With laparoscopic cholecystectomy being one of the most commonly performed procedures worldwide, a tool to predict operative duration could be extremely beneficial to healthcare organisations. Methods: Data collected from the CholeS study on patients undergoing cholecystectomy in UK and Irish hospitals between 04/2014 and 05/2014 were used to study operative duration. A multivariable binary logistic regression model was produced in order to identify significant independent predictors of long (> 90 min) operations. The resulting model was converted to a risk score, which was subsequently validated on a second cohort of patients using ROC curves. Results: After exclusions, data were available for 7227 patients in the derivation (CholeS) cohort. The median operative duration was 60 min (interquartile range 45–85), with 17.7% of operations lasting longer than 90 min. Ten factors were found to be significant independent predictors of operative durations > 90 min, including ASA grade, age, previous surgical admissions, BMI, gallbladder wall thickness and CBD diameter. A risk score was then produced from these factors and applied to a cohort of 2405 patients from a tertiary centre for external validation. This returned an area under the ROC curve of 0.708 (SE = 0.013, p < 0.001), with the proportion of operations lasting > 90 min increasing more than eightfold, from 5.1% to 41.8%, between the extremes of the score. Conclusion: The scoring tool produced in this study was found to be significantly predictive of long operative durations on validation in an external cohort. As such, the tool may have the potential to enable organisations to better organise theatre lists and deliver greater efficiencies in care.
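    The general workflow the abstract describes, summing integer point weights for each risk factor and then checking discrimination in a validation cohort with the area under the ROC curve, can be sketched as follows. The factor names, point weights, and synthetic cohort here are hypothetical illustrations, not the published CholeS score:

```python
import numpy as np

def auc_mann_whitney(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count half)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Pairwise comparisons; fine for illustration-sized cohorts.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical integer point weights, e.g. rounded logistic
# regression coefficients for each predictor (not the CholeS model).
POINTS = {"asa_3_plus": 2, "bmi_30_plus": 1, "thick_wall": 2, "male": 1}

def risk_score(patient):
    """Sum the points for each risk factor the patient has."""
    return sum(pts for factor, pts in POINTS.items() if patient.get(factor))

# Toy external-validation check on a synthetic cohort.
rng = np.random.default_rng(1)
cohort = [{f: rng.random() < 0.4 for f in POINTS} for _ in range(500)]
scores = np.array([risk_score(p) for p in cohort])
# Simulate long (> 90 min) operations as more likely at higher scores.
long_op = rng.random(500) < (0.05 + 0.08 * scores)
auc = auc_mann_whitney(scores, long_op)
# An AUC above 0.5 indicates the score discriminates long operations.
```

    The Mann-Whitney formulation is equivalent to integrating the ROC curve, which is why a single number summarises how well the score separates long from short operations in the validation cohort.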